-
Background: Prescribed fire is vital for fuel reduction and ecological restoration, but its effectiveness and fine-scale interactions are poorly understood. Aims: We developed methods for processing uncrewed aircraft systems (UAS) imagery into spatially explicit pyrometrics, including measurements of fuel consumption, rate of spread, and residence time, to quantitatively measure three prescribed fires. Methods: We collected infrared (IR) imagery continuously (0.2 Hz) over the prescribed burns and one experimental calibration burn, capturing fire progression and combustion for multiple hours. Key results: Pyrometrics were successfully extracted from UAS-IR imagery with sufficient spatiotemporal resolution to effectively measure and differentiate between fires. UAS-IR fuel consumption correlated with weight-based measurements of ten 1-m² experimental burn plots, validating our approach to estimating consumption with a cost-effective UAS-IR sensor (R² = 0.99; RMSE = 0.38 kg m⁻²). Conclusions: Our findings demonstrate that UAS-IR pyrometrics are an accurate approach to monitoring fire behaviour and effects, such as measurements of consumption. Prescribed fire is a fine-scale process; a ground sampling distance of <2.3 m² is recommended. Additional research is needed to validate other derived measurements. Implications: Refined fire monitoring coupled with refined objectives will be pivotal in informing fire management of best practices, justifying the use of prescribed fire and providing quantitative feedback in an uncertain environment.
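For illustration, the sketch below shows one common way per-pixel pyrometrics of this kind can be derived from a stack of co-registered UAS-IR frames. It is a minimal example under assumed inputs (a brightness-temperature array, the 0.2 Hz sampling interval, a hypothetical detection threshold, and a known pixel size), not the authors' published processing pipeline.

import numpy as np

def pyrometrics(frames, dt=5.0, t_fire=500.0, gsd=1.0):
    """Per-pixel fire arrival time (s), residence time (s), and a rate-of-spread
    proxy (m/s) from a (T, H, W) stack of brightness temperatures in kelvin.
    dt is the frame interval (0.2 Hz -> 5 s); t_fire and gsd are assumed values."""
    burning = frames >= t_fire                    # boolean fire mask per frame
    ever_burned = burning.any(axis=0)             # pixels that ignited at some point
    arrival_idx = burning.argmax(axis=0)          # first frame index above threshold
    arrival_s = np.where(ever_burned, arrival_idx * dt, np.nan)
    residence_s = burning.sum(axis=0) * dt        # total time spent above threshold
    # Rate-of-spread proxy: inverse magnitude of the fire arrival-time gradient.
    gy, gx = np.gradient(arrival_s, gsd)          # seconds per metre
    with np.errstate(divide="ignore", invalid="ignore"):
        ros = 1.0 / np.hypot(gy, gx)              # metres per second
    return arrival_s, residence_s, ros

Estimating fuel consumption additionally requires calibrating the IR-derived signal against ground-truth mass loss, which is the role the weighed 1-m² experimental plots play in the study.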
-
FLAME 3 is the third dataset in the FLAME series of aerial UAV-collected side-by-side multi-spectral wildland fire imagery (see FLAME 1 and FLAME 2). This set contains a single-burn subset of the larger FLAME 3 dataset focusing specifically on computer vision tasks such as fire detection and segmentation. Included are 622 image quartets labeled Fire and 116 image quartets labeled No Fire; the No Fire images are of the forestry surrounding the prescribed burn plot. Each image quartet is composed of four images: a raw RGB image, a raw thermal image, a corrected-FOV RGB image, and a radiometric thermal TIFF. Each of the four data types is detailed in Table 1. More information on data collection methods, data processing procedures, and data labeling can be found at https://arxiv.org/abs/2412.02831. This dataset also contains a NADIR Thermal Fire set, providing georeferenced overhead thermal imagery captured by UAV every 3-5 seconds and focused on monitoring fire progression and burn behavior over time. When processed, this data enables centimeter-grade measurements of fire spread and energy release over time. Pre-, post-, and during-burn imagery are included, along with ground control point (GCP) data. This dataset is based on the research conducted in the paper FLAME 3 Dataset: Unleashing the Power of Radiometric Thermal UAV Imagery for Wildfire Management, which provides detailed insights and analysis related to forest fire monitoring and modeling.

If you use this dataset in your research or projects, please cite the original paper as follows:

APA: Hopkins, B., ONeill, L., Marinaccio, M., Rowell, E., Parsons, R., Flanary, S., Nazim, I., Seielstad, C., & Afghah, F. (2024). FLAME 3 Dataset: Unleashing the Power of Radiometric Thermal UAV Imagery for Wildfire Management. arXiv preprint arXiv:2412.02831.

BibTeX:
@misc{hopkins2024flame3datasetunleashing,
  title={FLAME 3 Dataset: Unleashing the Power of Radiometric Thermal UAV Imagery for Wildfire Management},
  author={Bryce Hopkins and Leo ONeill and Michael Marinaccio and Eric Rowell and Russell Parsons and Sarah Flanary and Irtija Nazim and Carl Seielstad and Fatemeh Afghah},
  year={2024},
  eprint={2412.02831},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2412.02831},
}
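As a hedged illustration of working with the image quartets, the following sketch loads the four components of one quartet. The file names, extensions, and directory layout are assumptions made for the example and are not the dataset's documented structure; consult the paper and dataset documentation for the actual layout.

import numpy as np
from PIL import Image     # raw RGB, corrected-FOV RGB, and raw thermal images
import tifffile           # radiometric thermal TIFF

def load_quartet(stem):
    """Load one quartet under a hypothetical naming scheme:
    {stem}_rgb_raw.jpg, {stem}_rgb_corrected.jpg, {stem}_thermal_raw.jpg, {stem}_thermal.tiff."""
    rgb_raw  = np.asarray(Image.open(f"{stem}_rgb_raw.jpg"))
    rgb_corr = np.asarray(Image.open(f"{stem}_rgb_corrected.jpg"))
    ir_raw   = np.asarray(Image.open(f"{stem}_thermal_raw.jpg"))
    ir_tiff  = tifffile.imread(f"{stem}_thermal.tiff")   # per-pixel radiometric values
    return rgb_raw, rgb_corr, ir_raw, ir_tiff

# Example: a crude hot-pixel mask from the radiometric TIFF using an assumed
# threshold (units depend on how the sensor exports radiometric data).
# rgb_raw, rgb_corr, ir_raw, ir_tiff = load_quartet("quartet_0001")
# fire_mask = ir_tiff > 400.0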
-
Drone-based wildfire detection and modeling methods enable high-precision, real-time fire monitoring that is not provided by traditional remote fire monitoring systems such as satellite imaging. Precise, real-time information enables rapid, effective wildfire intervention and management strategies. Drone systems’ ease of deployment, omnidirectional maneuverability, and robust sensing capabilities make them effective tools for early wildfire detection and evaluation, particularly in environments that are inconvenient for humans and/or terrestrial vehicles. Development of emerging drone-based fire monitoring systems has been inhibited by a lack of well-annotated, high-quality aerial wildfire datasets, largely as a result of UAV flight regulations for prescribed burns and wildfires. The included dataset provides a collection of side-by-side infrared and visible-spectrum video pairs taken by drones during an open-canopy prescribed fire in Northern Arizona in 2021. The frames have been annotated by two independent classifiers with two binary labels. The Fire label is applied when the classifiers visually observe indications of fire in either the RGB or the IR frame of a pair. The Smoke label is applied when the classifiers visually estimate that at least 50% of the RGB frame is filled with smoke. To provide additional context to the main dataset’s aerial imagery, the provided supplementary dataset includes weather information, the prescribed burn plan, a geo-referenced RGB point cloud of the preburn area, an RGB orthomosaic of the preburn area, and links to further information.
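A minimal sketch of consuming the frame-pair labels is shown below; the CSV column names and file layout are assumptions for illustration only, since the distributed dataset documents its own format.

import csv

def iter_labeled_pairs(csv_path):
    """Yield (rgb_path, ir_path, fire, smoke) tuples from an assumed label CSV."""
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            yield (
                row["rgb_frame"],
                row["ir_frame"],
                row["fire"] == "1",    # fire visible in either the RGB or IR frame
                row["smoke"] == "1",   # >= 50% of the RGB frame filled with smoke
            )

# for rgb_path, ir_path, fire, smoke in iter_labeled_pairs("labels.csv"):
#     ...  # e.g. route Fire/Smoke pairs into training splits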